Conic Geometric Optimization on the Manifold of Positive Definite Matrices

Authors

  • Suvrit Sra
  • Reshad Hosseini
Abstract

We develop geometric optimisation on the manifold of Hermitian positive definite (hpd) matrices. In particular, we consider optimising two types of cost functions: (i) geodesically convex (g-convex); and (ii) log-nonexpansive (LN). G-convex functions are nonconvex in the usual Euclidean sense, but convex along the manifold and thus allow global optimisation. LN functions may fail to be even g-convex, but still remain globally optimisable due to their special structure. We develop theoretical tools to recognise and generate g-convex functions as well as cone-theoretic fixed-point optimisation algorithms. We illustrate our techniques by applying them to maximum-likelihood parameter estimation for elliptically contoured distributions (a rich class that substantially generalises the multivariate normal distribution). We compare our fixed-point algorithms with sophisticated manifold optimisation methods and observe notable speedups.
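A well-known instance of the fixed-point approach to maximum-likelihood scatter estimation for elliptically contoured distributions is the classical iteration for Tyler's M-estimator. The sketch below is illustrative only (the function name, trace normalisation, and stopping rule are our assumptions, not the paper's exact algorithm):

```python
import numpy as np

def tyler_fixed_point(X, max_iter=200, tol=1e-10):
    """Classical fixed-point iteration for Tyler's M-estimator of
    scatter (illustrative sketch, not the paper's exact algorithm).
    X: (n, d) data matrix with rows x_i."""
    n, d = X.shape
    S = np.eye(d)
    for _ in range(max_iter):
        Sinv = np.linalg.inv(S)
        # weights 1 / (x_i^T S^{-1} x_i)
        w = 1.0 / np.einsum('ij,jk,ik->i', X, Sinv, X)
        # S <- (d/n) * sum_i w_i x_i x_i^T
        S_new = (d / n) * (X * w[:, None]).T @ X
        # Tyler's estimator is defined only up to scale; fix trace = d
        S_new *= d / np.trace(S_new)
        if np.linalg.norm(S_new - S, 'fro') < tol:
            return S_new
        S = S_new
    return S

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))
S = tyler_fixed_point(X)
```

For Gaussian data the iteration converges to (a scale-normalised version of) the sample covariance shape, which makes it a convenient sanity check.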


Similar articles

Geometric optimisation on positive definite matrices for elliptically contoured distributions

Hermitian positive definite (hpd) matrices recur throughout machine learning, statistics, and optimisation. This paper develops (conic) geometric optimisation on the cone of hpd matrices, which allows us to globally optimise a large class of nonconvex functions of hpd matrices. Specifically, we first use the Riemannian manifold structure of the hpd cone for studying functions that are nonconvex...

Full text

The Riemannian Barzilai-Borwein Method with Nonmonotone Line-Search and the Matrix Geometric Mean Computation

The Barzilai-Borwein method, an effective gradient descent method with a clever choice of step length, is adapted from nonlinear optimization to Riemannian manifold optimization. More generally, global convergence of a nonmonotone line-search strategy for Riemannian optimization algorithms is proved under some standard assumptions. By a set of numerical tests, the Riemannian Barzilai-Borwein...

Full text
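The abstract above adapts the Barzilai-Borwein step length to Riemannian manifolds; the classical Euclidean scheme it starts from can be sketched as follows (a minimal illustration with the BB1 step rule; the function name, fallback step, and stopping rule are our assumptions):

```python
import numpy as np

def bb_gradient_descent(grad, x0, max_iter=500, tol=1e-8, alpha0=1e-3):
    """Euclidean Barzilai-Borwein gradient method (sketch of the
    classical scheme that Riemannian variants generalise)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for _ in range(max_iter):
        x_new = x - alpha * g
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            return x_new
        s, y = x_new - x, g_new - g
        # BB1 step length: <s, s> / <s, y>, guarded against curvature <= 0
        sy = s @ y
        alpha = (s @ s) / sy if sy > 0 else alpha0
        x, g = x_new, g_new
    return x

# minimise f(x) = 0.5 x^T D x with D = diag(1, 10)
D = np.diag([1.0, 10.0])
x_opt = bb_gradient_descent(lambda x: D @ x, [1.0, 1.0])
```

The step length uses only successive iterates and gradients, which is what makes the method cheap compared with a full line search; the Riemannian version replaces the differences `s` and `y` with tangent-space quantities.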

A Riemannian quasi-Newton method for computing the Karcher mean of symmetric positive definite matrices

This paper tackles the problem of computing the Karcher mean of a collection of symmetric positive-definite matrices. We present a concrete limited-memory Riemannian BFGS method to handle this computational task. We also provide methods to produce efficient numerical representations of geometric objects on the manifold of symmetric positive-definite matrices that are required for Riemannian opt...

Full text
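As a baseline for the Karcher mean problem described above, the standard Riemannian gradient fixed-point iteration (with unit step and the arithmetic mean as starting point, both our choices for illustration; this is not the limited-memory BFGS method of the paper) can be sketched as:

```python
import numpy as np

def spd_fun(A, f):
    """Apply a scalar function to a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * f(w)) @ V.T

def karcher_mean(mats, max_iter=100, tol=1e-12):
    """Riemannian gradient fixed-point iteration for the Karcher mean
    of SPD matrices (illustrative baseline, unit step length)."""
    X = sum(mats) / len(mats)  # arithmetic mean as starting point
    for _ in range(max_iter):
        Xh = spd_fun(X, np.sqrt)        # X^{1/2}
        Xih = np.linalg.inv(Xh)         # X^{-1/2}
        # Riemannian gradient direction: mean of log(X^{-1/2} A X^{-1/2})
        G = sum(spd_fun(Xih @ A @ Xih, np.log) for A in mats) / len(mats)
        if np.linalg.norm(G, 'fro') < tol:
            break
        X = Xh @ spd_fun(G, np.exp) @ Xh  # exponential-map update
    return X

M = karcher_mean([np.diag([1.0, 4.0]), np.diag([4.0, 16.0])])
```

For commuting (e.g. diagonal) matrices the Karcher mean reduces to the elementwise geometric mean, which gives an easy correctness check.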

Geometric Optimization in Machine Learning

Machine learning models often rely on sparsity, low-rank, orthogonality, correlation, or graphical structure. The structure of interest in this chapter is geometric, specifically the manifold of positive definite (PD) matrices. Though these matrices recur throughout the applied sciences, our focus is on more recent developments in machine learning and optimization. In particular, we study (i) m...

Full text

Computing the Matrix Geometric Mean of Two HPD Matrices: A Stable Iterative Method

A new iteration scheme for computing the sign of a matrix which has no pure imaginary eigenvalues is presented. Then, by applying a well-known identity in matrix functions theory, an algorithm for computing the geometric mean of two Hermitian positive definite matrices is constructed. Moreover, another efficient algorithm for this purpose is derived free from the computation of principal matrix...

Full text
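For reference, the geometric mean of two HPD matrices also has a standard closed form, A # B = A^{1/2} (A^{-1/2} B A^{-1/2})^{1/2} A^{1/2}, which the iterative schemes above are designed to compute more cheaply. A direct sketch (function names are ours; real symmetric input assumed):

```python
import numpy as np

def spd_sqrt(A):
    """Principal square root of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.sqrt(w)) @ V.T

def geometric_mean(A, B):
    """Closed-form geometric mean A # B of two SPD matrices."""
    Ah = spd_sqrt(A)
    Aih = np.linalg.inv(Ah)
    return Ah @ spd_sqrt(Aih @ B @ Aih) @ Ah

G = geometric_mean(np.diag([1.0, 4.0]), np.diag([9.0, 16.0]))
```

The result satisfies the defining Riccati equation G A^{-1} G = B, so correctness is easy to verify numerically.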


Journal:
  • SIAM Journal on Optimization

Volume 25, Issue

Pages  -

Publication date: 2015